Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia

Job Summary
The Data Analyst is responsible for data capture, cleaning, preparation, management, analysis, and interpretation as required by the business, handling data with precision and confidentiality. The role is crucial for providing actionable insights to improve teaching quality, enhance student performance, optimize business operations, and drive growth.
Essential Job Responsibilities:
Academic Data Analysis
· Analyze student performance data from internal assessments, mock tests, and board exams.
· Identify trends in subject-wise performance, batch-wise progress, and dropout patterns.
· Generate reports to assist academic heads in making data-driven interventions.
· Predict outcomes of upcoming competitive exams (JEE/NEET) based on historical data.
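The outcome-prediction duty above can be sketched with a simple least-squares trend over a student's past mock-test totals. This is an illustrative sketch only: the score data is hypothetical, and a real pipeline would pull from the assessment database and use a proper statistical model.

```python
# Minimal sketch: project a student's next mock-test score from a
# linear least-squares trend over past scores. Data is hypothetical.

def fit_trend(scores):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict_next(scores):
    """Extrapolate the fitted line one step past the last observation."""
    slope, intercept = fit_trend(scores)
    return slope * len(scores) + intercept

mock_scores = [112, 118, 121, 127, 130]   # hypothetical mock-test totals
print(round(predict_next(mock_scores), 1))   # 135.1
```

In practice this would feed the batch-level reports mentioned above rather than stand alone.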
Business & Marketing Intelligence
· Monitor enrollment trends, inquiry-to-admission conversions, and campaign ROI.
· Provide insights into market behavior, location-wise performance, and competitor benchmarking.
· Analyze fee structures, discounts, and scholarship schemes for profitability and optimization.
Operational & Centre Efficiency
· Evaluate the performance of branches/centres using KPIs like retention rate, attendance, student satisfaction, and staff productivity.
· Support planning for new center openings with predictive enrollment data and location-based analytics.
Technology & Automation
· Build and maintain dashboards using BI tools (Power BI, Tableau, Excel, Google Data Studio).
· Work closely with IT or CRM teams to ensure data accuracy and integrity across systems.
· Automate regular reporting processes to reduce manual workload.
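A recurring report of the kind described above can be automated with a short script. The batch names and attendance figures below are hypothetical; a scheduled job would pull real figures from the CRM and distribute the output.

```python
from statistics import mean

# Sketch of an automated weekly report: aggregate per-batch attendance
# rates into a plain-text summary. All names and numbers are illustrative.
attendance = {
    "Batch-A": [0.92, 0.88, 0.95],
    "Batch-B": [0.75, 0.80, 0.78],
}

def build_report(data):
    """Render one summary line per batch, sorted by batch name."""
    lines = ["Weekly attendance report"]
    for batch, rates in sorted(data.items()):
        lines.append(f"{batch}: {mean(rates):.1%} average attendance")
    return "\n".join(lines)

print(build_report(attendance))
```

Running this on a schedule (cron, Airflow, etc.) is what removes the manual workload the bullet refers to.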
Strategic Decision Support
· Support senior management with forecasting models, statistical reports, and scenario simulations.
· Present insights in the form of structured reports, visual dashboards, or presentations.
Supervisory Responsibilities
Number of subordinate supervisors reporting to this job
N/A
Total number of employees supervised; include those directly supervised and those supervised through subordinate supervisors
N/A
Job Qualifications:
Education
· Bachelor’s or Master’s degree in Statistics, Mathematics, Engineering, Data Science, Computer Science, or related field.
Experience
· Must be highly skilled, with 3+ years of experience.
· Demonstrated ability to successfully process all work types.
Knowledge / Skills
· Analytical skills and critical thinking
· Data visualization
· Microsoft Excel, VBA, R, Python, Tableau, Power BI
· Proficient in AI-powered data analysis tools
Licenses / Certifications
· Data Analytics certification desirable
Working Conditions:
Virtual environment; extended periods in a seated position; medium-stress environment.
Physical Demands:
N/A
Duration: Full Time
Location: Vishakhapatnam, Bangalore, Chennai
Years of Experience: 3+ years
Job Description :
- 3+ years of experience working as a Data Engineer, with a thorough understanding of data frameworks that collect, manage, transform, and store data to derive business insights.
- Strong communication skills (written and verbal), along with being a good team player.
- 2+ years of experience within the Big Data ecosystem (Hadoop, Sqoop, Hive, Spark, Pig, etc.).
- 2+ years of strong experience with SQL and Python (data-engineering focused).
- Experience with GCP data services such as BigQuery, Dataflow, Dataproc, etc. is preferred and an added advantage.
- Prior experience with ETL tools such as DataStage, Informatica, dbt, Talend, etc. is an added advantage for the role.
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients' Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
Skills (at least 50% of the below):
• A passion for all things data: understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc.
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo, or any similar tool
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• SnowPro Advanced: Data Engineer certification is a must.
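The star-schema modeling called out above can be sketched with an in-memory SQLite database: one fact table keyed to dimension tables, queried by joining and aggregating. Table and column names (`fact_sales`, `dim_product`, `dim_date`) are hypothetical; in a Snowflake environment the same shape would be built with Snowflake DDL.

```python
import sqlite3

# Illustrative star schema: a central fact table referencing two dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO dim_date    VALUES (1, '2024-01-01'), (2, '2024-01-02');
    INSERT INTO fact_sales  VALUES (1, 1, 10.0), (1, 2, 15.0), (2, 1, 7.5);
""")

# The typical star-schema query: aggregate the fact, sliced by a dimension.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)   # [('Gadget', 7.5), ('Widget', 25.0)]
```

The design choice the schema illustrates: facts stay narrow and numeric, while descriptive attributes live in the dimensions, which keeps BI-style slicing queries simple.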
Our company is seeking to hire a skilled software developer to help with the development of our AI/ML platform. Your duties will primarily revolve around building the platform by writing code in Scala, as well as modifying the platform to fix errors, work on distributed computing, adapt it to new cloud services, improve its performance, or upgrade interfaces. To be successful in this role, you will need extensive knowledge of programming languages and the software development life cycle.
Responsibilities:
Analyze, design, develop, troubleshoot, and debug the platform.
Write code, guide other team members on best practices, and perform testing and debugging of applications.
Specify, design, and implement minor changes to existing software architecture. Build highly complex enhancements and resolve complex bugs. Build and execute unit tests and unit plans.
Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise.
Experience:
The candidate should have 2+ years of experience with design and development in Java/Scala. Experience with algorithms, distributed systems, data structures, databases, and distributed-system architecture is mandatory.
Required Skills:
1. In-depth knowledge of Hadoop and Spark architecture and their components, such as HDFS, YARN, and executor, core, and memory parameters.
2. Knowledge of Scala/Java.
3. Extensive experience in developing Spark jobs. Should possess good OOP knowledge and be aware of enterprise application design patterns.
4. Good knowledge of Unix/Linux.
5. Experience working on large-scale software projects.
6. Keeps an eye out for technological trends and open-source projects that can be used.
7. Knows common programming languages and frameworks.
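The Spark job development mentioned above boils down to a chain of transformations that Spark distributes across executors. As a hedged sketch, the classic word count can be written in plain Python to show the map, shuffle, and reduce stages that `flatMap` and `reduceByKey` would perform on an RDD; the input lines are hypothetical.

```python
from collections import defaultdict
from functools import reduce

# Plain-Python sketch of the Spark word-count pattern. Spark would run
# each stage in parallel across executors; here each stage is a local step.
lines = ["spark makes big data simple", "big data needs spark"]

# map stage: flatMap each line into (word, 1) pairs
pairs = [(w, 1) for line in lines for w in line.split()]

# shuffle stage: group pairs by key (what reduceByKey does implicitly)
groups = defaultdict(list)
for word, count in pairs:
    groups[word].append(count)

# reduce stage: sum the counts within each key
counts = {word: reduce(lambda a, b: a + b, cs) for word, cs in groups.items()}
print(counts["spark"], counts["big"])   # 2 2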
Technical
- Own and lead solution architecture of engineering solutions from product conception to launch, involving a variety of technologies including but not limited to Core Java, Hibernate, Spring, AWS, MySQL, Oracle, NoSQL, Angular, Bootstrap, Apache CXF, Apache Lucene, web services (REST and SOAP), AJAX, dimensional modeling, machine learning, data analytics, etc., and ensure that the implemented solutions align with and support business strategies and requirements.
- Perform technical feasibility; drive shaping the product through its life cycle to ensure scalability, performance, security, and compliance standards.
- Drive technical issues to closure with problem-solving skills, and be ready to do hands-on work.
- Collaborate with delivery, engineering, quality, support, experience, program, and infrastructure partners to achieve goals.
- Support new opportunities for growth and continuous improvement through lateral and creative thinking.
- Stay updated on new technologies and on changes in technologies that affect back-end and front-end web development.
Management
- Accountability for architecture, product excellence, decision-making, client communications, and solution outcomes.
- Foster, facilitate, and furnish timely decision-making across a broad network of stakeholders, delivery partners, and operational teams.
- Coach and develop a team of outstanding individuals providing product engineering development and support solutions.
- Build strong relationships with key business stakeholders across multiple business units and be their trusted advisor.
- Help management establish processes and standards.
Skills & Qualifications Required:
- Should have scored 70% and above in 10th and 12th grade.
- A minimum of a Bachelor's degree in Computer Science or a related software engineering discipline, or equivalent.
- 15+ years in technical development and solution architecture for enterprise applications, with experience working on the full stack.
- Strong, self-driven individual who can lead in a fast-paced, complex environment, demonstrate problem-solving skills in varied situations, and make decisions while taking all stakeholders along.
- Thought leadership, curiosity about business and engineering processes, and drive to keep pace with new and emerging trends, technologies, and innovations.
- Ability to manage partnerships with all areas and members of the business as well as all levels of the organization.
- Excellent spoken and written communication, with the ability to present complex ideas in a clear, concise fashion to technical and non-technical audiences.
- Ability to lead from the front, with willingness to be hands-on and work with the team to understand and resolve problems.
Good to have:
- Experience building or managing distributed systems.
- Knowledge of data analytics and machine learning.
- Experience in the Fintech domain.
- Experience with Agile/Scrum development methodologies.
Senior Team Lead, Software Engineering (96386)
Role: Senior Team Lead
Skills: Must be an expert in the following:
- Java
- Microservices
- Hadoop
- People management skills
Knowledge of AWS is a plus.
Location: Bangalore India – North Gate.
The ideal candidate will be a BTech in Computer Science or an MCA, well-versed in full-stack development of business applications using PHP, with MySQL as the database and HTML as the front end. Knowledge of other tech stacks is preferred, as is an understanding of the MS Azure cloud environment. Familiarity with Power BI will be useful.
This job is a contracting role; max $70/hr.
Java or Scala web developer with 8 to 12 years of experience and strong fundamentals/proficiency in core technologies used for web development: HTML, CSS, JavaScript, Spring, and Hibernate (including relational database experience).
- Object-oriented analysis and design patterns using Java/J2EE technologies
- Knowledge of the Spring Framework, MVC architectures, and ORM frameworks like Hibernate
- Experience with RESTful web services and data modeling
- Strong experience in relational database design and development (preferably with Oracle) and understanding of NoSQL databases like HBase, Druid, Solr
- Experience working with event/message-based communication platforms such as Kafka, ActiveMQ, etc.
- Experience working with Hadoop technologies and the Spark framework
- Working proficiency in build and development tools (Maven, Gradle, Jenkins)
- Experience with test frameworks like JUnit and Mockito
- Experience in front-end development using modern JavaScript frameworks and charting frameworks
REQUIREMENTS:
- Prior experience working in large-scale data engineering.
- 4+ years of experience working in data engineering and/or backend technologies; cloud experience (any) is mandatory.
- Prior experience architecting and designing backends for large-scale data processing.
- Familiarity and experience with different technologies related to data engineering: different database technologies, Hadoop, Spark, Storm, Hive, etc.
- Hands-on, with the ability to contribute to key portions of the data engineering backend.
- Self-driven and motivated to deliver exceptional results.
- Familiarity and experience working with the different stages of data engineering: data acquisition, data refining, large-scale data processing, and efficient data storage for business analysis.
- Familiarity and experience working with different DB technologies and how to scale them.
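The stages listed above (acquisition, refining, storage) can be sketched as a tiny stdlib-only pipeline. The records, field names, and cleaning rules are hypothetical; a production workflow would replace each stage with the appropriate distributed tooling.

```python
import sqlite3

# Hypothetical mini-pipeline covering the stages named above:
# acquisition (raw records) -> refining (clean/validate) -> storage (SQLite).
raw = [
    {"user": " alice ", "amount": "10.5"},
    {"user": "BOB",     "amount": "not-a-number"},   # bad record, dropped
    {"user": "carol",   "amount": "3"},
]

def refine(records):
    """Normalize names and drop rows whose amount fails to parse."""
    for r in records:
        try:
            yield r["user"].strip().lower(), float(r["amount"])
        except ValueError:
            continue   # in production: route to a dead-letter store instead

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user TEXT, amount REAL)")
con.executemany("INSERT INTO events VALUES (?, ?)", refine(raw))

total = con.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)   # 13.5
```

The point of the sketch is the separation of stages: each can be scaled or swapped independently, which is what the architecture responsibilities below are about.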
RESPONSIBILITIES:
- End-to-end responsibility for data engineering architecture, design, development, and implementation.
- Build data engineering workflows for large-scale data processing.
- Discover opportunities in data acquisition.
- Bring industry best practices to the data engineering workflow.
- Develop data set processes for data modelling, mining, and production.
- Take on additional technical responsibilities to drive an initiative to completion.
- Recommend ways to improve data reliability, efficiency, and quality.
- Goes out of their way to reduce complexity.
- Humble and outgoing: an engineering cheerleader.









